-
The opaque relationship between biology and behavior is an intractable problem for psychiatry, and it increasingly challenges longstanding diagnostic categorizations. While various big data sciences have been repeatedly deployed as potential solutions, they have so far complicated more than they have managed to disentangle. Attending to categorical misalignment, this article proposes one reason why this is the case: Datasets have to instantiate clinical categories in order to make biological sense of them, and they do so in different ways. Here, I use mixed methods to examine the role of the reuse of big data in recent genomic research on autism spectrum disorder (ASD). I show how divergent regimes of psychiatric categorization are innately encoded within commonly used datasets from MSSNG and 23andMe, contributing to a rippling disjuncture in the accounts of autism that this body of research has produced. Beyond the specific complications this dynamic introduces for the category of autism, this paper argues for the necessity of critical attention to the role of dataset reuse and recombination across human genomics and beyond.
-
Recent reporting has revealed that the UK Biobank (UKB)—a large, publicly funded research database containing highly sensitive health records of over half a million participants—has shared its data with private insurance companies seeking to develop actuarial AI systems for analyzing risk and predicting health. While news reports have characterized this as a significant breach of public trust, the UKB contends that insurance research is “in the public interest,” and that all research participants are adequately protected from the possibility of insurance discrimination via data de-identification. Here, we contest both of these claims. Insurers use population data to identify novel categories of risk, which become fodder in the production of black-boxed actuarial algorithms. The deployment of these algorithms, as we argue, has the potential to increase inequality in health and decrease access to insurance. Importantly, these types of harms are not limited to UKB participants: instead, they are likely to proliferate unevenly across various populations within global insurance markets via practices of profiling and sorting based on the synthesis of multiple data sources, alongside advances in data analysis capabilities, over space/time. This necessitates a significantly expanded understanding of the publics who must be involved in biobank governance and data-sharing decisions involving insurers.
-
Within the ongoing disruption of the COVID-19 pandemic, technologically mediated health surveillance programs have vastly intensified and expanded to new spaces. Popular understandings of medical and health data protections came into question as a variety of institutions introduced new tools for symptom tracking, contact tracing, and the management of related data. These systems have raised complex questions about who should have access to health information, under what circumstances, and how people and institutions negotiate relationships between privacy, public safety, and care during times of crisis. In this paper, we take up the case of a large public university working to keep campus productive during COVID-19 through practices of placemaking, symptom screeners, and vaccine mandate compliance databases. Drawing on a multi-methods study including thirty-eight interviews, organizational documents, and discursive analysis, we show where and for whom administrative care infrastructures either misrecognized or torqued (Bowker and Star 1999) the care relationships that made life possible for people in the university community. We argue that an analysis of care—including the social relations that enable it and those that attempt to hegemonically define it—opens important questions for how people relate to data they produce about their bodies as well as to the institutions that manage them. Furthermore, we argue that privacy frameworks that rely on individual rights, essential categories of “sensitive information,” or the normative legitimacy of institutional practices are not equipped to reveal how people negotiate privacy and care in times of crisis.
